Multimodal Tucker Decomposition for Gated RBM Inference

Authors

Abstract

Gated networks are networks that contain gating connections, in which the outputs of at least two neurons are multiplied. The basic idea of a gated restricted Boltzmann machine (RBM) model is to use binary hidden units to learn the conditional distribution of one image (the output) given another image (the input). This allows the hidden units of a gated RBM to model transformations between successive images. Inference consists of extracting the transformations given a pair of images. However, a fully connected multiplicative network creates cubically many parameters, forming a three-dimensional interaction tensor that requires a lot of memory and computation for inference and training. In this paper, we parameterize the bilinear interactions through a multimodal tensor-based Tucker decomposition. Tucker decomposition decomposes a tensor into a set of matrices and one (usually smaller) core tensor. This parameterization helps reduce the number of model parameters, reduces the computational costs of the learning process, and effectively strengthens structured feature learning. When trained on affine transformations of still images, we show how a completely unsupervised network learns explicit encodings of image transformations.
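
For intuition, the following NumPy sketch illustrates the general idea of replacing a full three-way interaction tensor with a Tucker core and three factor matrices. It is not the authors' implementation; all sizes, ranks, and variable names (I, J, K, R1, R2, R3, G, A, B, C) are hypothetical, and the factorized inference step shown is just the standard algebraic consequence of the Tucker form.

```python
import numpy as np

# Hypothetical sizes: input image units I, output image units J, hidden units K,
# and smaller Tucker ranks R1, R2, R3 for the core tensor.
I, J, K = 169, 169, 100          # e.g. vectorized 13x13 patches, 100 hidden units
R1, R2, R3 = 32, 32, 16

rng = np.random.default_rng(0)

# Tucker parameterization: one small core tensor plus a factor matrix per mode,
# instead of a full I x J x K interaction tensor with cubically many parameters.
G = rng.standard_normal((R1, R2, R3))   # core tensor
A = rng.standard_normal((I, R1))        # input-mode factors
B = rng.standard_normal((J, R2))        # output-mode factors
C = rng.standard_normal((K, R3))        # hidden-mode factors

x = rng.standard_normal(I)              # input image (vectorized)
y = rng.standard_normal(J)              # output image (vectorized)

# Hidden pre-activations computed directly from the factors, without ever
# forming W[i, j, k] = sum_{p,q,r} G[p, q, r] * A[i, p] * B[j, q] * C[k, r].
h_factored = np.einsum('pqr,p,q,kr->k', G, A.T @ x, B.T @ y, C)

# Sanity check against the explicitly reconstructed full tensor.
W = np.einsum('pqr,ip,jq,kr->ijk', G, A, B, C)
h_full = np.einsum('ijk,i,j->k', W, x, y)
assert np.allclose(h_factored, h_full)

print(f"full tensor params: {I * J * K:,}  vs  Tucker params: "
      f"{G.size + A.size + B.size + C.size:,}")
```

With these illustrative sizes the full tensor would hold roughly 2.9 million parameters, while the Tucker factors hold under thirty thousand, which is the kind of saving the parameterization is aimed at.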

Similar Resources

Equivariant and scale-free Tucker decomposition models

Analyses of array-valued datasets often involve reduced-rank array approximations, typically obtained via least-squares or truncations of array decompositions. However, least-squares approximations tend to be noisy in high-dimensional settings, and may not be appropriate for arrays that include discrete or ordinal measurements. This article develops methodology to obtain low-rank model-based re...

Some Theory on Non-negative Tucker Decomposition

Some theoretical difficulties that arise from dimensionality reduction for tensors with non-negative coefficients are discussed in this paper. A necessary and sufficient condition is derived for a low nonnegative rank tensor to admit a non-negative Tucker decomposition with a core of the same non-negative rank. Moreover, we provide evidence that the only algorithm operating mode-wise, minimizing...

Alternating proximal gradient method for sparse nonnegative Tucker decomposition

Multi-way data arises in many applications such as electroencephalography classification, face recognition, text mining and hyperspectral data analysis. Tensor decomposition has been commonly used to find the hidden factors and elicit the intrinsic structures of the multi-way data. This paper considers sparse nonnegative Tucker decomposition (NTD), which is to decompose a given tensor into the p...

Infinite Tucker Decomposition: Nonparametric Bayesian Models for Multiway Data Analysis

Tensor decomposition is a powerful computational tool for multiway data analysis. Many popular tensor decomposition approaches, such as the Tucker decomposition and CANDECOMP/PARAFAC (CP), amount to multi-linear factorization. They are insufficient to model (i) complex interactions between data entities, (ii) various data types (e.g. missing data and binary data), and (iii) noisy observations and ...

Gated Multimodal Units for Information Fusion

This paper presents a novel model for multimodal learning based on gated neural networks. The Gated Multimodal Unit (GMU) model is intended to be used as an internal unit in a neural network architecture whose purpose is to find an intermediate representation based on a combination of data from different modalities. The GMU learns to decide how modalities influence the activation of the unit u...
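
As a rough reference, here is a minimal NumPy sketch of one common two-modality GMU formulation (a gate decides how much each projected modality contributes); the function name, dimensions, and weight names are illustrative and not taken from the cited paper.

```python
import numpy as np

def gmu_fuse(x_v, x_t, W_v, W_t, W_z):
    """Fuse a visual and a textual feature vector with a learned sigmoid gate."""
    h_v = np.tanh(W_v @ x_v)                                        # visual projection
    h_t = np.tanh(W_t @ x_t)                                        # textual projection
    z = 1.0 / (1.0 + np.exp(-(W_z @ np.concatenate([x_v, x_t]))))   # gate in (0, 1)
    return z * h_v + (1.0 - z) * h_t                                # gated fusion

# Hypothetical dimensions: 512-d visual features, 300-d text features, 128-d fused unit.
rng = np.random.default_rng(0)
d_v, d_t, d_h = 512, 300, 128
h = gmu_fuse(rng.standard_normal(d_v), rng.standard_normal(d_t),
             rng.standard_normal((d_h, d_v)), rng.standard_normal((d_h, d_t)),
             rng.standard_normal((d_h, d_v + d_t)))
print(h.shape)  # (128,)
```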

Journal

Journal title: Applied Sciences

Year: 2021

ISSN: 2076-3417

DOI: https://doi.org/10.3390/app11167397